UNIVERSALITY OF NEURAL NETWORKS WITH A SIGMOIDAL ACTIVATION OR DISCRIMINATORY FUNCTIONS ON FUNCTIONAL BANACH LATTICES OVER THE REAL LINE
Abstract
The universality of neural networks with sigmoidal or discriminatory activation functions is characterized on functional Banach lattices. This study extends a previous result on the analogous problem for variable Lebesgue spaces with unbounded exponent, which contain the essentially bounded measurable functions. The notion of a functional Banach lattice, however, covers other fundamental function spaces as well, so our result is a natural generalization of the well-known theorem of Cybenko. To this end, we examine the structure of uniformly continuous functions over such domains.
Similar Articles
The Dynamic Universality of Sigmoidal Neural Networks
We investigate the computational power of recurrent neural networks that apply the sigmoid activation function σ(x) = 2/(1 + e^(−x)) − 1. These networks are extensively used in automatic learning of non-linear dynamical behavior. We show that in the noiseless model, there exists a universal architecture that can be used to compute any recursive (Turing) function. This is the first result of its kind for...
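The activation in that abstract, σ(x) = 2/(1 + e^(−x)) − 1, is a rescaled logistic that coincides algebraically with tanh(x/2); a quick numerical check of the identity:

```python
# Verify that sigma(x) = 2/(1 + exp(-x)) - 1 equals tanh(x/2) pointwise.
import math

def sigma(x):
    return 2.0 / (1.0 + math.exp(-x)) - 1.0

for x in (-5.0, -1.0, 0.0, 0.5, 3.0):
    assert abs(sigma(x) - math.tanh(x / 2.0)) < 1e-12
```

The identity follows from (1 − e^(−x)) / (1 + e^(−x)) = tanh(x/2), so results stated for this sigmoid transfer directly to tanh-activated networks.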
Ridge Functions, Sigmoidal Functions and Neural Networks
This paper considers mainly approximation by ridge functions. Fix a point a ∈ ℝⁿ and a function g : ℝ → ℝ. Then the function f : ℝⁿ → ℝ defined by f(x) = g(a·x), x ∈ ℝⁿ, is a ridge or plane wave function. A sigmoidal function is a particular example of such a function g, which closely resembles 1 at +∞ and 0 at −∞. This paper discusses approximation problems involving general ridge function...
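The defining property of a ridge function f(x) = g(a·x) is that it is constant along every direction orthogonal to a; a small sketch with illustrative choices of a and g:

```python
# A ridge function varies only along the direction a: adding any vector
# orthogonal to a leaves its value unchanged. The vectors and g are
# illustrative choices, not from the paper.
import numpy as np

a = np.array([1.0, 2.0, -1.0])
g = np.tanh                      # any univariate g works; tanh is sigmoidal

def f(x):
    return g(a @ x)              # ridge function f(x) = g(a . x)

x = np.array([0.3, -0.7, 1.1])
v = np.array([2.0, 0.0, 2.0])    # a @ v = 2 + 0 - 2 = 0, so v is orthogonal to a
```

Here f(x) and f(x + v) agree, since the inner product a·x is unchanged by adding v.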
The Effect of Functional/Notional Approach on the Proficiency Level of EFL Learners and Its Evaluation Through Functional Test
In fact, this study focused on the following questions: 1. Is there any difference between the effect of the functional/notional approach and structural approaches to language teaching on the proficiency of EFL learners? 2. Can a rather innovative language test, referred to as a "functional test", be devised so as to measure the proficiency of EFL learners, and thus be as reliable an...
Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as Ω((log k)^(1/4)), where k is the degree of the polynomials. This bound is valid for any input dimension, i.e., independently of the number of variables. The result is obtained by introducing a new met...
LatticeRnn: Recurrent Neural Networks Over Lattices
We present a new model called LATTICERNN, which generalizes recurrent neural networks (RNNs) to process weighted lattices as input, instead of sequences. A LATTICERNN can encode the complete structure of a lattice into a dense representation, which makes it suitable for a variety of problems, including rescoring, classifying, parsing, or translating lattices using deep neural networks (DNNs). In...
Journal
Title: Journal of Mathematical Sciences
Year: 2022
ISSN: 1072-3374, 1573-8795
DOI: https://doi.org/10.1007/s10958-022-05889-7